1.
Elife ; 13, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38489224

ABSTRACT

How neural representations preserve information about multiple stimuli is mysterious. Because tuning of individual neurons is coarse (e.g., visual receptive field diameters can exceed perceptual resolution), the populations of neurons potentially responsive to each individual stimulus can overlap, raising the question of how information about each item might be segregated and preserved in the population. We recently reported evidence for a potential solution to this problem: when two stimuli were present, some neurons in the macaque visual cortical areas V1 and V4 exhibited fluctuating firing patterns, as if they responded to only one individual stimulus at a time (Jun et al., 2022). However, whether such an information encoding strategy is ubiquitous in the visual pathway and thus could constitute a general phenomenon remains unknown. Here, we provide new evidence that such fluctuating activity is also evoked by multiple stimuli in visual areas responsible for processing visual motion (middle temporal visual area, MT), and faces (middle fundus and anterolateral face patches in inferotemporal cortex - areas MF and AL), thus extending the scope of circumstances in which fluctuating activity is observed. Furthermore, consistent with our previous results in the early visual area V1, MT exhibits fluctuations between the representations of two stimuli when these form distinguishable objects but not when they fuse into one perceived object, suggesting that fluctuating activity patterns may underlie visual object formation. Taken together, these findings point toward an updated model of how the brain preserves sensory information about multiple stimuli for subsequent processing and behavioral action.


Subjects
Visual Cortex, Visual Pathways, Visual Pathways/physiology, Visual Cortex/physiology, Visual Fields, Neurons/physiology, Photic Stimulation
2.
Proc Natl Acad Sci U S A ; 120(48): e2303562120, 2023 Nov 28.
Article in English | MEDLINE | ID: mdl-37988462

ABSTRACT

Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in eye movement-related eardrum oscillations (EMREOs), pressure changes recorded in the ear canal that occur in conjunction with simultaneous eye movements. We show that EMREOs contain parametric information about horizontal and vertical eye displacement as well as initial/final eye position with respect to the head. The parametric information in the horizontal and vertical directions can be modeled as combining linearly, allowing accurate prediction of the EMREOs associated with oblique (diagonal) eye movements. Target location can also be inferred from the EMREO signals recorded during eye movements to those targets. We hypothesize that the (currently unknown) mechanism underlying EMREOs could impose a two-dimensional eye-movement-related transfer function on any incoming sound, permitting subsequent processing stages to compute the positions of sounds in relation to the visual scene.
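
The abstract's key computational claim is linearity: the horizontal and vertical EMREO components combine additively, so the response to an oblique saccade is predictable from the two cardinal components. A minimal sketch of that idea follows, with synthetic placeholder waveforms (basis_h, basis_v, and predict_emreo are illustrative names, not the authors' code):

```python
# Hypothetical horizontal/vertical basis waveforms standing in for the
# measured EMREO components; real waveforms would be estimated from data.
import numpy as np

t = np.linspace(0, 0.1, 1000)                     # 100 ms around saccade onset
basis_h = np.sin(2 * np.pi * 30 * t)              # assumed horizontal component
basis_v = np.sin(2 * np.pi * 30 * t + np.pi / 4)  # assumed vertical component

def predict_emreo(dx_deg, dy_deg):
    """Predict the oblique-saccade EMREO as a linear combination of components."""
    return dx_deg * basis_h + dy_deg * basis_v

# A 10-degree saccade at 45 degrees: the prediction is just the scaled sum.
oblique = predict_emreo(10 / np.sqrt(2), 10 / np.sqrt(2))
print(oblique.shape)
```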


Subjects
Eye Movements, Saccades, Movement, Ocular Physiological Phenomena, Sound
3.
Hear Res ; 440: 108899, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37979436

ABSTRACT

We recently discovered a unique type of otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects. Overall, these findings provide important context for the widespread observations of visual- and eye-movement related signals found in cortical and subcortical auditory areas of the brain.


Subjects
Hearing, Tympanic Membrane, Humans, Hearing/physiology, Otoacoustic Emissions, Spontaneous/physiology, Acoustic Impedance Tests, Sound
4.
Philos Trans R Soc Lond B Biol Sci ; 378(1886): 20220340, 2023 Sep 25.
Article in English | MEDLINE | ID: mdl-37545299

ABSTRACT

Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery. How this reconciliation might happen remains poorly understood. Because humans and monkeys both have mobile eyes and therefore both must perform this shift of reference frames, comparison of the EMREO across species can provide insights into shared, and therefore important, parameters of the signal. Here we show that rhesus monkeys, like humans, have a consistent, significant EMREO signal that carries parametric information about eye displacement as well as onset times of eye movements. The dependence of the EMREO on the horizontal displacement of the eye is its most consistent feature, shared across behavioural tasks, subjects and species. Differences chiefly involve the waveform frequency (higher in monkeys than in humans), the patterns of individual variation (more prominent in monkeys than in humans), and the waveform of the EMREO once factors due to horizontal and vertical eye displacements were controlled for. This article is part of the theme issue 'Decision and control processes in multisensory perception'.


Subjects
Eye Movements, Tympanic Membrane, Humans, Cues, Movement
5.
bioRxiv ; 2023 Jul 19.
Article in English | MEDLINE | ID: mdl-37502939

ABSTRACT

How neural representations preserve information about multiple stimuli is mysterious. Because tuning of individual neurons is coarse (for example, visual receptive field diameters can exceed perceptual resolution), the populations of neurons potentially responsive to each individual stimulus can overlap, raising the question of how information about each item might be segregated and preserved in the population. We recently reported evidence for a potential solution to this problem: when two stimuli were present, some neurons in the macaque visual cortical areas V1 and V4 exhibited fluctuating firing patterns, as if they responded to only one individual stimulus at a time. However, whether such an information encoding strategy is ubiquitous in the visual pathway and thus could constitute a general phenomenon remains unknown. Here we provide new evidence that such fluctuating activity is also evoked by multiple stimuli in visual areas responsible for processing visual motion (middle temporal visual area, MT), and faces (middle fundus and anterolateral face patches in inferotemporal cortex - areas MF and AL), thus extending the scope of circumstances in which fluctuating activity is observed. Furthermore, consistent with our previous results in the early visual area V1, MT exhibits fluctuations between the representations of two stimuli when these form distinguishable objects but not when they fuse into one perceived object, suggesting that fluctuating activity patterns may underlie visual object formation. Taken together, these findings point toward an updated model of how the brain preserves sensory information about multiple stimuli for subsequent processing and behavioral action. Impact Statement: We find neural fluctuations in multiple areas along the visual cortical hierarchy that could allow the brain to represent distinct co-occurring visual stimuli.

6.
bioRxiv ; 2023 Aug 06.
Article in English | MEDLINE | ID: mdl-36945521

ABSTRACT

We recently discovered a unique type of low-frequency otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects. Overall, these findings provide important context for the widespread observations of visual- and eye-movement related signals found in cortical and subcortical auditory areas of the brain.

7.
Elife ; 11, 2022 Nov 29.
Article in English | MEDLINE | ID: mdl-36444983

ABSTRACT

Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than one stimulus is present, single neurons can fluctuate between coding one vs. the other(s) across some time period, suggesting a form of neural multiplexing of different stimuli (Caruso et al., 2018). Here, we investigate (a) whether such coding fluctuations occur in early visual cortical areas; (b) how coding fluctuations are coordinated across the neural population; and (c) how coordinated coding fluctuations depend on the parsing of stimuli into separate vs. fused objects. We found coding fluctuations do occur in macaque V1 but only when the two stimuli form separate objects. Such separate objects evoked a novel pattern of V1 spike count ('noise') correlations involving distinct distributions of positive and negative values. This bimodal correlation pattern was most pronounced among pairs of neurons showing the strongest evidence for coding fluctuations or multiplexing. Whether a given pair of neurons exhibited positive or negative correlations depended on whether the two neurons both responded better to the same object or had different object preferences. Distinct distributions of spike count correlations based on stimulus preferences were also seen in V4 for separate objects but not when two stimuli fused to form one object. These findings suggest multiple objects evoke different response dynamics than those evoked by single stimuli, lending support to the multiplexing hypothesis and suggesting a means by which information about multiple objects can be preserved despite the apparent coarseness of sensory coding.


Subjects
Visual Cortex, Animals, Neurons, Macaca, Brain
8.
Eur J Neurosci ; 55(2): 528-548, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34844286

ABSTRACT

How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that the sharpness of tuning curves might change to limit the number of stimuli driving any given neuron when multiple stimuli are present. To test this hypothesis, we recorded the activity of neurons in the inferior colliculus while monkeys made saccades to either one or two simultaneous sounds differing in frequency and spatial location. Although monkeys easily distinguished simultaneous sounds (~90% correct performance), the frequency selectivity of inferior colliculus neurons on dual-sound trials did not improve in any obvious way. Instead, frequency selectivity was degraded on dual-sound trials compared with single-sound trials: neural response functions broadened, and frequency accounted for less of the variance in firing rate. These changes in neural firing led a maximum-likelihood decoder to perform worse on dual-sound trials than on single-sound trials. These results fail to support the hypothesis that changes in frequency response functions serve to reduce the overlap in the representation of simultaneous sounds. Rather, they suggest that alternative mechanisms, such as the recently reported alternation in firing rate between the rates corresponding to each of the two stimuli, offer a more promising account.
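
For readers unfamiliar with the kind of maximum-likelihood decoding mentioned above, a hedged sketch follows: Poisson tuning is estimated per neuron, and a held-out population response is assigned the candidate frequency with the highest summed log-likelihood. All tuning values and names here are simulated assumptions, not the study's data or code.

```python
# Simulated Poisson tuning for a small population; decode a held-out trial
# by maximizing the summed log-likelihood over candidate frequencies.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
freqs = np.array([420, 762, 1375, 2485])      # candidate frequencies (Hz), assumed
n_neurons = 30
tuning = rng.uniform(2, 40, size=(n_neurons, len(freqs)))  # mean counts per freq

def decode(counts):
    """Return the frequency whose predicted rates best explain the counts."""
    loglik = poisson.logpmf(counts[:, None], tuning).sum(axis=0)
    return freqs[np.argmax(loglik)]

true_idx = 2
test_counts = rng.poisson(tuning[:, true_idx])
print(decode(test_counts))                    # usually recovers freqs[2]
```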


Subjects
Inferior Colliculi, Sound Localization, Acoustic Stimulation, Animals, Inferior Colliculi/physiology, Macaca mulatta, Sound, Sound Localization/physiology
9.
Ann Appl Stat ; 15(1): 41-63, 2021 Mar.
Article in English | MEDLINE | ID: mdl-34413921

ABSTRACT

Conventional analysis of neuroscience data involves computing average neural activity over a group of trials and/or a period of time. This approach may be particularly problematic when assessing the response patterns of neurons to more than one simultaneously presented stimulus. In such cases the brain must represent each individual component of the stimulus bundle, but trial-and-time-pooled averaging methods are fundamentally unequipped to address the means by which multi-item representation occurs. We introduce and investigate a novel statistical analysis framework that relates the firing pattern of a single cell, exposed to a stimulus bundle, to the ensemble of its firing patterns under each constituent stimulus. Existing statistical tools focus on what may be called "first order stochasticity": trial-to-trial variation in the form of unstructured noise around a fixed firing rate curve associated with a given stimulus. Our analysis is based upon the theoretical premise that exposure to a stimulus bundle induces additional stochasticity in the cell's response pattern, in the form of a stochastically varying recombination of its single-stimulus firing rate curves. We discuss challenges to statistical estimation of such "second order stochasticity" and address them with a novel dynamic admixture point process (DAPP) model. DAPP is a hierarchical point process model that decomposes second order stochasticity into a Gaussian stochastic process and a random vector of interpretable features, and facilitates borrowing of information on the latter across repeated trials through latent clustering. We illustrate the utility and accuracy of the DAPP analysis with synthetic data simulation studies. We present real-world evidence of second order stochastic variation with an analysis of monkey inferior colliculus recordings under auditory stimuli.
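
A toy illustration of the premise behind "second order stochasticity" may help: on each simulated dual-stimulus trial, the firing rate is a smoothly varying convex mixture of the two single-stimulus rate curves, with the mixing weight itself a random process. This mimics only the generative premise of the DAPP model; it is not the authors' implementation.

```python
# Each dual-stimulus trial mixes the two single-stimulus rate curves with a
# smoothly varying weight alpha(t): a logistic-squashed Gaussian random walk.
# Spikes are drawn as a Bernoulli approximation of an inhomogeneous Poisson
# process. All rates and scales are illustrative.
import numpy as np

rng = np.random.default_rng(1)
dt, T = 0.001, 1.0
t = np.arange(0, T, dt)
rate_A = 50 * np.ones_like(t)   # single-stimulus rate curve A (Hz), assumed
rate_B = 20 * np.ones_like(t)   # single-stimulus rate curve B (Hz), assumed

def dual_trial():
    walk = np.cumsum(rng.normal(0, 0.15, size=t.size))   # smooth latent process
    alpha = 1 / (1 + np.exp(-walk))                      # mixing weight in (0, 1)
    rate = alpha * rate_A + (1 - alpha) * rate_B
    return rng.random(t.size) < rate * dt                # spike train (booleans)

counts = [dual_trial().sum() for _ in range(10)]
print(counts)   # spread exceeds Poisson noise around any one fixed rate
```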

10.
Annu Rev Vis Sci ; 7: 201-223, 2021 Sep 15.
Article in English | MEDLINE | ID: mdl-34242053

ABSTRACT

Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual-auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function. The auditory cortex and inferior colliculus are two key points of entry where visual signals reach the auditory pathway, and both contain visual- and/or eye movement-related signals in humans and other animals. The visual signals observed in these auditory structures reflect a mixture of visual modulation of auditory-evoked activity and visually driven responses that are selective for stimulus location or features. These key response attributes also appear in the classic visual pathway but may play a different role in the auditory pathway: to modify auditory rather than visual perception. Finally, while this review focuses on two particular areas of the auditory pathway where this question has been studied, robust descending as well as ascending connections within this pathway suggest that undiscovered visual signals may be present at other stages as well.


Subjects
Auditory Cortex, Animals, Auditory Cortex/physiology, Mammals, Sensation, Sense Organs, Visual Pathways, Visual Perception/physiology
11.
J Neurophysiol ; 126(1): 82-94, 2021 Jul 1.
Article in English | MEDLINE | ID: mdl-33852803

ABSTRACT

Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the reference frames of two sensory modalities, vision and audition, across three interconnected brain areas involved in generating saccades, namely the frontal eye fields (FEF), lateral and medial parietal cortex (M/LIP), and superior colliculus (SC). We recorded from single neurons in head-restrained monkeys performing auditory- and visually guided saccades from variable initial fixation locations and evaluated whether their receptive fields were better described as eye-centered, head-centered, or hybrid (i.e., not anchored uniquely to head or eye orientation). We found a progression of reference frames across areas and across time, with considerable hybrid-ness and persistent differences between modalities during most epochs/brain regions. For both modalities, the SC was more eye-centered than the FEF, which in turn was more eye-centered than the predominantly hybrid M/LIP. In all three areas and temporal epochs from stimulus onset to movement, visual signals were more eye-centered than auditory signals. In the SC and FEF, auditory signals became more eye-centered at the time of the saccade than they were initially after stimulus onset, but only in the SC at the time of the saccade did the auditory signals become "predominantly" eye-centered. The results indicate that visual and auditory signals both undergo transformations, ultimately reaching the same final reference frame but via different dynamics across brain regions and time. NEW & NOTEWORTHY Models for visual-auditory integration posit that visual signals are eye-centered throughout the brain, whereas auditory signals are converted from head-centered to eye-centered coordinates. We show instead that both modalities largely employ hybrid reference frames: neither fully head- nor eye-centered. Across three hubs of the oculomotor network (intraparietal cortex, frontal eye field, and superior colliculus) visual and auditory signals evolve from hybrid to a common eye-centered format via different dynamics across brain areas and time.
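
One common way to quantify reference frames of the kind compared above is to ask whether a neuron's tuning curves from different fixations align better when indexed in head-centered or in eye-centered coordinates. A sketch on simulated data for a purely eye-centered neuron follows; the alignment metric and all numbers are illustrative assumptions, not the study's analysis code.

```python
# Simulate an eye-centered neuron, then compare how well its tuning curves
# from three fixations align when indexed in head- vs eye-centered coordinates.
import numpy as np

rng = np.random.default_rng(2)
targets_head = np.arange(-24, 25, 6)   # target azimuths, head-centered (deg)
fixations = np.array([-6, 0, 6])       # initial eye-in-head positions (deg)

def response(target_head, fixation):
    eye_centered = target_head - fixation            # retinal location
    return 30 * np.exp(-(eye_centered - 6) ** 2 / 200) + rng.normal(0, 1)

tuning = np.array([[response(tg, fx) for tg in targets_head] for fx in fixations])

def alignment(curves):
    """Mean pairwise correlation of tuning curves across fixations."""
    idx = np.triu_indices(len(curves), k=1)
    return np.corrcoef(curves)[idx].mean()

head_aligned = alignment(tuning)   # curves indexed by head-centered location
# Re-index each curve by eye-centered location (roll by the fixation offset;
# edge bins wrap around, which is tolerable for this sketch).
eye = np.array([np.roll(tuning[i], -int(fx // 6)) for i, fx in enumerate(fixations)])
print(head_aligned, alignment(eye))   # eye-centered alignment should be higher
```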


Subjects
Auditory Perception/physiology, Frontal Lobe/physiology, Parietal Lobe/physiology, Saccades/physiology, Superior Colliculi/physiology, Visual Perception/physiology, Acoustic Stimulation/methods, Animals, Macaca mulatta, Photic Stimulation/methods, Time Factors
12.
J Neurophysiol ; 124(3): 715-727, 2020 Sep 1.
Article in English | MEDLINE | ID: mdl-32727263

ABSTRACT

The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies (especially in animals) have assumed fusion of cross-modal information, whereas recent work in humans has begun to probe the appropriateness of this assumption. Here we present results from a novel behavioral task in which both monkeys (Macaca mulatta) and humans localized visual and auditory stimuli and reported their perceived sources through saccadic eye movements. When the locations of visual and auditory stimuli were widely separated, subjects made two saccades, whereas when the two stimuli were presented at the same location they made only a single saccade. Intermediate levels of separation produced mixed response patterns: a single saccade to an intermediate position on some trials or separate saccades to both locations on others. The distribution of responses was well described by a hierarchical causal inference model that accurately predicted both the explicit "same vs. different" source judgments and the biases in localization of the source(s) under each of these conditions. The results from this task are broadly consistent with prior work in humans across a wide variety of analogous tasks, extending the study of multisensory causal inference to nonhuman primates and to a natural behavioral task with both a categorical assay of the number of perceived sources and a continuous report of the perceived position of the stimuli. NEW & NOTEWORTHY We developed a novel behavioral paradigm for the study of multisensory causal inference in both humans and monkeys and found that both species make causal judgments in the same Bayes-optimal fashion. To our knowledge, this is the first demonstration of behavioral causal inference in animals, and this cross-species comparison lays the groundwork for future experiments using neuronal recording techniques that are impractical or impossible in human subjects.
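
A hedged sketch of the standard Bayesian causal-inference computation that model classes like the one above build on (the usual Körding-style formulation; all parameter values are illustrative, not fitted to this study):

```python
# Posterior probability that noisy visual and auditory location estimates
# share one source, with fusion implied when that posterior is high.
# SDs and the common-source prior are illustrative, not fitted values.
import numpy as np
from scipy.stats import norm

sigma_v, sigma_a, sigma_p = 2.0, 8.0, 20.0   # visual, auditory, prior SDs (deg)
p_common = 0.5                               # prior probability of one source

def posterior_one_source(x_v, x_a):
    # Likelihood under one source: the shared location is integrated out.
    var1 = (sigma_v**2 * sigma_a**2 + sigma_v**2 * sigma_p**2
            + sigma_a**2 * sigma_p**2)
    num1 = ((x_v - x_a)**2 * sigma_p**2 + x_v**2 * sigma_a**2
            + x_a**2 * sigma_v**2)
    like1 = np.exp(-0.5 * num1 / var1) / (2 * np.pi * np.sqrt(var1))
    # Likelihood under two independent sources.
    like2 = (norm.pdf(x_v, 0, np.sqrt(sigma_v**2 + sigma_p**2))
             * norm.pdf(x_a, 0, np.sqrt(sigma_a**2 + sigma_p**2)))
    return like1 * p_common / (like1 * p_common + like2 * (1 - p_common))

# Nearby cues favor one source (one saccade); distant cues favor two.
print(posterior_one_source(5.0, 7.0), posterior_one_source(5.0, 25.0))
```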


Subjects
Auditory Perception/physiology, Saccades/physiology, Space Perception/physiology, Thinking/physiology, Visual Perception/physiology, Adult, Animals, Eye-Tracking Technology, Female, Humans, Male, Sound Localization/physiology
13.
Article in English | MEDLINE | ID: mdl-34505116

ABSTRACT

We recently reported the existence of fluctuations in neural signals that may permit neurons to code multiple simultaneous stimuli sequentially across time [1]. This required deploying a novel statistical approach to permit investigation of neural activity at the scale of individual trials. Here we present tests using synthetic data to assess the sensitivity and specificity of this analysis. We fabricated datasets to match each of several potential response patterns derived from single-stimulus response distributions. In particular, we simulated dual-stimulus trial spike counts that reflected fluctuating mixtures of the single-stimulus spike counts, stable intermediate averages, single-stimulus winner-take-all, or response distributions that were outside the range defined by the single-stimulus responses (such as summation or suppression). We then assessed how well the analysis recovered the correct response pattern as a function of the number of simulated trials and the difference between the simulated responses to each "stimulus" alone. We found excellent recovery of the mixture, intermediate, and outside categories (>97% correct), and good recovery of the single/winner-take-all category (>90% correct) when the number of trials was >20 and the single-stimulus response rates were 50 Hz and 20 Hz, respectively. Both larger numbers of trials and greater separation between the single-stimulus firing rates improved categorization accuracy. These results provide a benchmark, and guidelines for data collection, for use of this method to investigate coding of multiple items at the individual-trial timescale.
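
A sketch of how such synthetic dual-stimulus spike counts can be fabricated for each benchmark category, using the 50 Hz and 20 Hz single-stimulus rates quoted above; the generator is illustrative, with category definitions following the abstract:

```python
# Fabricate dual-stimulus spike counts for each benchmark category from the
# two single-stimulus rates (counts taken over a 1 s window).
import numpy as np

rng = np.random.default_rng(3)
rate_a, rate_b, n_trials = 50, 20, 25   # Hz; 25 trials, per the >20 guideline

def simulate(category):
    if category == "mixture":        # each trial drawn from A's or B's rate
        pick = rng.random(n_trials) < 0.5
        return rng.poisson(np.where(pick, rate_a, rate_b))
    if category == "intermediate":   # stable average of the two rates
        return rng.poisson((rate_a + rate_b) / 2, n_trials)
    if category == "single":         # winner-take-all: one stimulus dominates
        return rng.poisson(rate_a, n_trials)
    if category == "outside":        # e.g. summation, beyond the single range
        return rng.poisson(rate_a + rate_b, n_trials)
    raise ValueError(category)

for cat in ("mixture", "intermediate", "single", "outside"):
    print(cat, simulate(cat).mean().round(1))
```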

14.
J Eye Mov Res ; 12(7), 2019 Nov 25.
Article in English | MEDLINE | ID: mdl-33828768

ABSTRACT

Keynote by Jenny Groh (Duke University) at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19 August 2019. Information about eye movements with respect to the head is required for reconciling visual and auditory space. This keynote presentation describes recent findings concerning how eye movements affect early auditory processing via motor processes in the ear (eye movement-related eardrum oscillations, or EMREOs). Computational efforts to understand how eye movements are factored into auditory processing to produce a reference frame aligned with visual space uncovered a second critical issue: sound location is not mapped but is instead rate (meter) coded in the primate brain, unlike visual space. Meter coding would appear to limit the representation of multiple simultaneous sounds. The second part of this presentation concerns how such a meter code could use fluctuating activity patterns to circumvent this limitation. Video stream: https://vimeo.com/356576513.

15.
Nat Commun ; 9(1): 2715, 2018 Jul 13.
Article in English | MEDLINE | ID: mdl-30006598

ABSTRACT

How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple stimuli by interleaving signals across time. We record single units in an auditory region, the inferior colliculus, while monkeys localize 1 or 2 simultaneous sounds. During dual-sound trials, we find that some neurons fluctuate between firing rates observed for each single sound, either on a whole-trial or on a sub-trial timescale. These fluctuations are correlated in pairs of neurons, can be predicted by the state of local field potentials prior to sound onset, and, in one monkey, can predict which sound will be reported first. We find corroborating evidence of fluctuating activity patterns in a separate dataset involving responses of inferotemporal cortex neurons to multiple visual stimuli. Alternation between activity patterns corresponding to each of multiple items may therefore be a general strategy to enhance the brain's processing capacity, potentially linking such disparate phenomena as variable neural firing, neural oscillations, and limits in attentional/memory capacity.
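
A toy illustration of the whole-trial signature described above: if a neuron switches between its two single-sound rates across dual-sound trials, the dual-sound count distribution is bimodal, and each trial can be assigned to a rate by a Poisson likelihood ratio. The data and rate values below are assumptions for illustration only.

```python
# Simulate a neuron that switches between its two single-sound rates across
# dual-sound trials, then assign each trial to a rate by likelihood ratio.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(4)
rate_a, rate_b = 40, 12   # assumed single-sound mean counts per trial
dual = rng.poisson(np.where(rng.random(200) < 0.5, rate_a, rate_b))

llr = poisson.logpmf(dual, rate_a) - poisson.logpmf(dual, rate_b)
print((llr > 0).mean())   # fraction of "A-like" trials, near 0.5 here
```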


Subjects
Action Potentials/physiology, Auditory Cortex/physiology, Auditory Perception/physiology, Inferior Colliculi/physiology, Neurons/physiology, Acoustic Stimulation, Animals, Attention/physiology, Auditory Cortex/cytology, Electrodes, Implanted, Female, Inferior Colliculi/cytology, Macaca mulatta, Neurons/cytology, Single-Cell Analysis, Sound, Stereotaxic Techniques
16.
J Neurophysiol ; 119(4): 1411-1421, 2018 Apr 1.
Article in English | MEDLINE | ID: mdl-29357464

ABSTRACT

We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. In this study, we assessed how this neural computation unfolds across three interconnected structures: frontal eye fields (FEF), intraparietal cortex (LIP/MIP), and the superior colliculus (SC). Single-unit activity was assessed in head-restrained monkeys performing visually guided saccades from different initial fixations. As previously shown, the receptive fields of most LIP/MIP neurons shifted to novel positions on the retina for each eye position, and these locations were not clearly related to each other in either eye- or head-centered coordinates (defined as hybrid coordinates). In contrast, the receptive fields of most SC neurons were stable in eye-centered coordinates. In FEF, visual signals were intermediate between those patterns: around 60% were eye-centered, whereas the remainder showed changes in receptive field location, boundaries, or responsiveness that rendered the response patterns hybrid or occasionally head-centered. These results suggest that FEF may act as a transitional step in an evolution of coordinates between LIP/MIP and SC. The persistence across cortical areas of mixed representations that do not provide unequivocal location labels in a consistent reference frame has implications for how these representations must be read out. NEW & NOTEWORTHY How we perceive the world as stable using mobile retinas is poorly understood. We compared the stability of visual receptive fields across different fixation positions in three visuomotor regions. Irregular changes in receptive field position were ubiquitous in intraparietal cortex, evident but less common in the frontal eye fields, and negligible in the superior colliculus (SC), where receptive fields shifted reliably across fixations. Only the SC provides a stable labeled-line code for stimuli across saccades.


Subjects
Electroencephalography/methods, Electrophysiological Phenomena, Frontal Lobe/physiology, Parietal Lobe/physiology, Saccades/physiology, Superior Colliculi/physiology, Visual Perception/physiology, Animals, Macaca mulatta
17.
Proc Natl Acad Sci U S A ; 115(6): E1309-E1318, 2018 Feb 6.
Article in English | MEDLINE | ID: mdl-29363603

ABSTRACT

Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.
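
The basic measurement here is saccade-locked averaging of the ear-canal microphone signal, split by saccade direction so that the reported direction dependence appears as averages of opposite sign. A self-contained sketch with synthetic placeholders follows (fs, onsets, and directions are assumptions, not recorded data):

```python
# Epoch a (synthetic) ear-canal microphone signal around saccade onsets and
# average within saccade direction; opposite-signed averages would indicate
# the direction-dependent phase reversal described above.
import numpy as np

rng = np.random.default_rng(5)
fs = 10_000                                  # sample rate (Hz), assumed
mic = rng.normal(0, 1, fs * 60)              # 60 s of placeholder microphone data
onsets = rng.integers(fs, fs * 59, 100)      # hypothetical saccade-onset samples
directions = rng.choice([-1, 1], 100)        # -1 = leftward, +1 = rightward
win = np.arange(-int(0.01 * fs), int(0.05 * fs))   # -10 ms to +50 ms window

epochs = np.stack([mic[o + win] for o in onsets])
left_avg = epochs[directions == -1].mean(axis=0)
right_avg = epochs[directions == 1].mean(axis=0)
print(left_avg.shape, right_avg.shape)
```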


Subjects
Auditory Pathways/physiology, Brain/physiology, Hearing/physiology, Saccades/physiology, Tympanic Membrane/physiology, Adolescent, Adult, Animals, Female, Humans, Macaca mulatta, Male, Photic Stimulation, Young Adult
18.
J Neurosci ; 36(18): 5071-83, 2016 May 4.
Article in English | MEDLINE | ID: mdl-27147659

ABSTRACT

Understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achieve desired perceptual outcomes. We measured the contribution of inferior colliculus (IC) sites to perception using combined recording and electrical stimulation. Monkeys performed a frequency-based discrimination task, reporting whether a probe sound was higher or lower in frequency than a reference sound. Stimulation pulses were paired with the probe sound on 50% of trials (0.5-80 µA, 100-300 Hz, n = 172 IC locations in 3 rhesus monkeys). Electrical stimulation tended to bias the animals' judgments in a fashion that was coarsely but significantly correlated with the best frequency of the stimulation site compared with the reference frequency used in the task. Although there was considerable variability in the effects of stimulation (including impairments in performance and shifts in performance away from the direction predicted based on the site's response properties), the results indicate that stimulation of the IC can evoke percepts correlated with the frequency-tuning properties of the IC. Consistent with the implications of recent human studies, the main avenue for improvement for the auditory midbrain implant suggested by our findings is to increase the number and spatial extent of electrodes, to increase the size of the region that can be electrically activated, and to provide a greater range of evoked percepts. SIGNIFICANCE STATEMENT: Patients with hearing loss stemming from causes that interrupt the auditory pathway after the cochlea need a brain prosthetic to restore hearing. Recently, prosthetic stimulation in the human inferior colliculus (IC) was evaluated in a clinical trial. Thus far, speech understanding was limited for the subjects and this limitation is thought to be partly due to challenges in harnessing the sound frequency representation in the IC. Here, we tested the effects of IC stimulation in monkeys trained to report the sound frequencies they heard. Our results indicate that the IC can be used to introduce a range of frequency percepts and suggest that placement of a greater number of electrode contacts may improve the effectiveness of such implants.


Subjects
Cochlear Implants, Discrimination, Psychological/physiology, Inferior Colliculi/physiology, Mesencephalon/physiology, Acoustic Stimulation, Algorithms, Animals, Auditory Pathways/physiology, Behavior, Animal/physiology, Electric Stimulation, Electrodes, Implanted, Female, Macaca mulatta
19.
J Neurophysiol ; 115(6): 3162-73, 2016 Jun 1.
Article in English | MEDLINE | ID: mdl-26936983

ABSTRACT

Saccadic eye movements can be elicited by more than one type of sensory stimulus. This implies substantial transformations of signals originating in different sense organs as they reach a common motor output pathway. In this study, we compared the prevalence and magnitude of auditory- and visually evoked activity in a structure implicated in oculomotor processing, the primate frontal eye fields (FEF). We recorded from 324 single neurons while 2 monkeys performed delayed saccades to visual or auditory targets. We found that 64% of FEF neurons were active on presentation of auditory targets and 87% were active during auditory-guided saccades, compared with 75 and 84% for visual targets and saccades. As saccade onset approached, the average level of population activity in the FEF became indistinguishable on visual and auditory trials. FEF activity was better correlated with the movement vector than with the target location for both modalities. In summary, the large proportion of auditory-responsive neurons in the FEF, the similarity between visual and auditory activity levels at the time of the saccade, and the strong correlation between the activity and the saccade vector suggest that auditory signals undergo tailoring to match roughly the strength of visual signals present in the FEF, facilitating accessing of a common motor output pathway.


Subjects
Action Potentials/physiology, Frontal Lobe/cytology, Neurons/physiology, Saccades, Visual Fields/physiology, Acoustic Stimulation, Analysis of Variance, Animals, Female, Frontal Lobe/diagnostic imaging, Frontal Lobe/physiology, Macaca mulatta, Magnetic Resonance Imaging, Male, Photic Stimulation, Psychophysics, Reaction Time
20.
PLoS One ; 9(1): e85017, 2014.
Article in English | MEDLINE | ID: mdl-24454779

ABSTRACT

Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons. Maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This coding format discrepancy therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.
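
A loose sketch of a read-out in which both the site of population activity (a population-vector "map" term) and its overall level (a "rate" term) contribute to the location estimate. The functional form and all constants are assumptions for illustration, not the paper's fitted algorithm:

```python
# Hybrid read-out: a population-vector estimate (where the activity sits)
# adjusted by total activity level (how much activity there is).
import numpy as np

preferred = np.linspace(-40, 40, 41)   # preferred azimuths of SC sites (deg)
level_ref = 100.0                      # assumed reference population level

def read_out(activity, gain=0.3):
    site = (activity * preferred).sum() / activity.sum()   # map contribution
    level = activity.sum()                                 # rate contribution
    return site * (1 + gain * (level - level_ref) / level_ref)

# Visual-like input: a circumscribed bump; auditory-like input: an open-ended
# ramp. The two codes yield different site/level combinations for the same
# 10-degree target, which the read-out maps onto location estimates.
visual = 10 * np.exp(-(preferred - 10) ** 2 / 50)
auditory = 10 / (1 + np.exp(-(preferred - 10) / 8))
print(read_out(visual), read_out(auditory))
```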


Subjects
Acoustic Stimulation, Primates/physiology, Superior Colliculi/physiology, Algorithms, Animals